On the Q-linear Convergence of a Majorized Proximal ADMM for Convex Composite Programming and Its Applications to Regularized Logistic Regression
Authors
Abstract
This paper studies the convergence rate of a majorized alternating direction method of multipliers with indefinite proximal terms (iPADMM) for solving linearly constrained convex composite optimization problems. We establish a Q-linear convergence rate theorem for the 2-block majorized iPADMM under mild conditions. Based on this result, we analyze the convergence rate of the symmetric Gauss-Seidel based majorized ADMM, which is designed for solving multi-block composite convex optimization problems. We apply the majorized iPADMM to solve three types of regularized logistic regression problems: constrained regression, fused lasso, and overlapping group lasso. The efficiency of the majorized iPADMM is demonstrated on both simulated experiments and real data sets.
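To make the setting concrete, below is a minimal sketch of how a majorized proximal ADMM of this kind could be applied to one of the three applications listed above, fused-lasso regularized logistic regression. It is an illustration only, not the authors' implementation: the splitting B = [I; D], the linearized w-step, and the parameter names (lam1, lam2, sigma, rho) are assumptions made for this example. The point of the majorization is that the w-subproblem, which has no closed form for the logistic loss, is replaced by a quadratic upper bound that does.

```python
# A minimal sketch (not the authors' implementation) of a majorized proximal
# (linearized) ADMM for fused-lasso regularized logistic regression,
#
#   min_w  (1/n) * sum_i log(1 + exp(-y_i * x_i^T w))
#          + lam1 * ||w||_1 + lam2 * sum_j |w_{j+1} - w_j|,
#
# rewritten as  min_{w,z} f(w) + h(z)  s.t.  B w - z = 0  with  B = [I; D].
# The smooth loss f is majorized by its Lipschitz quadratic upper bound and
# the coupling term is linearized via a proximal term, so the w-step is a
# single closed-form update.  All parameter names are illustrative.
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def logistic_grad(w, X, y):
    """Gradient of f(w) = (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)), y_i in {-1, +1}."""
    margins = -y * (X @ w)
    probs = 1.0 / (1.0 + np.exp(-margins))         # sigmoid(-y_i * x_i^T w)
    return -(X.T @ (y * probs)) / X.shape[0]


def fused_lasso_logreg_admm(X, y, lam1=0.1, lam2=0.1, sigma=1.0,
                            max_iter=500, tol=1e-6):
    n, p = X.shape
    D = np.eye(p - 1, p, k=1) - np.eye(p - 1, p)   # first-difference matrix
    B = np.vstack([np.eye(p), D])                  # stack identity on top of D
    L = np.linalg.norm(X, 2) ** 2 / (4.0 * n)      # Lipschitz constant of grad f
    rho = sigma * np.linalg.norm(B, 2) ** 2        # keeps the proximal term valid

    w = np.zeros(p)
    z = np.zeros(B.shape[0])
    u = np.zeros(B.shape[0])                       # scaled multiplier

    for _ in range(max_iter):
        # w-step: minimize the quadratic majorant of f plus the linearized
        # augmented term; this collapses to one gradient-like update.
        g = logistic_grad(w, X, y) + sigma * (B.T @ (B @ w - z + u))
        w_new = w - g / (L + rho)

        # z-step: separable proximal maps, lam1 for the identity block and
        # lam2 for the difference block.
        v = B @ w_new + u
        z = np.concatenate([soft_threshold(v[:p], lam1 / sigma),
                            soft_threshold(v[p:], lam2 / sigma)])

        # multiplier update (unit dual step length for simplicity).
        u = u + B @ w_new - z

        if np.linalg.norm(w_new - w) <= tol * (1.0 + np.linalg.norm(w)):
            w = w_new
            break
        w = w_new
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    w_true = np.zeros(50)
    w_true[10:20] = 1.0                            # piecewise-constant signal
    y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
    w_hat = fused_lasso_logreg_admm(X, y)
    print("nonzero coefficients:", int(np.sum(np.abs(w_hat) > 1e-3)))
```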
Similar Articles
A Majorized ADMM with Indefinite Proximal Terms for Linearly Constrained Convex Composite Optimization
This paper presents a majorized alternating direction method of multipliers (ADMM) with indefinite proximal terms for solving linearly constrained 2-block convex composite optimization problems with each block in the objective being the sum of a non-smooth convex function (p(x) or q(y)) and a smooth convex function (f(x) or g(y)), i.e., min_{x∈X, y∈Y} { p(x) + f(x) + q(y) + g(y) | A^*x + B^*y = c }. B...
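For orientation, the generic 2-block update scheme behind this class of methods can be sketched as follows in the scaled-multiplier form commonly used in the literature (a generic template, not transcribed from the paper): \hat f(\cdot; x^k) and \hat g(\cdot; y^k) denote quadratic majorants of f and g at the current iterates, \mathcal{S} and \mathcal{T} are self-adjoint (possibly indefinite) proximal operators, \sigma > 0 is the penalty parameter, and \tau > 0 is the dual step length.

```latex
\begin{aligned}
x^{k+1} &\in \operatorname*{arg\,min}_{x \in \mathcal{X}}
  \Bigl\{ p(x) + \hat f(x; x^{k})
  + \tfrac{\sigma}{2}\bigl\| A^{*}x + B^{*}y^{k} - c + \sigma^{-1}\lambda^{k} \bigr\|^{2}
  + \tfrac{1}{2}\,\| x - x^{k} \|_{\mathcal{S}}^{2} \Bigr\},\\
y^{k+1} &\in \operatorname*{arg\,min}_{y \in \mathcal{Y}}
  \Bigl\{ q(y) + \hat g(y; y^{k})
  + \tfrac{\sigma}{2}\bigl\| A^{*}x^{k+1} + B^{*}y - c + \sigma^{-1}\lambda^{k} \bigr\|^{2}
  + \tfrac{1}{2}\,\| y - y^{k} \|_{\mathcal{T}}^{2} \Bigr\},\\
\lambda^{k+1} &= \lambda^{k} + \tau\sigma\,\bigl( A^{*}x^{k+1} + B^{*}y^{k+1} - c \bigr).
\end{aligned}
```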
Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Programming
In this paper, we aim to prove the linear rate convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a mild calmness condition, which holds automatically for convex composite piecewise linear-quadratic programming, we establish the global Q-linear rate of convergence for a general semi-proximal ADMM w...
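For readers unfamiliar with the terminology used here and in the abstract above, the standard definition of Q-linear convergence (stated for a point limit; these papers measure the distance to the solution set in a suitable weighted norm) is:

```latex
\| w^{k+1} - w^{\ast} \| \;\le\; \mu \,\| w^{k} - w^{\ast} \|
\quad \text{for all } k \ge k_{0}, \qquad \text{for some } \mu \in (0,1),
```

whereas R-linear convergence only requires the weaker geometric bound \| w^{k} - w^{\ast} \| \le c\,\mu^{k} for some constant c > 0.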
Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Quadratic and Semi-Definite Programming
In this paper, we aim to provide a comprehensive analysis on the linear rate convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a certain error bound condition, we establish the global linear rate of convergence for a more general semi-proximal ADMM with the dual steplength being restricted to be i...
Symmetric ADMM with Positive-Indefinite Proximal Regularization for Linearly Constrained Convex Optimization
The proximal ADMM, which adds proximal regularizations to the ADMM subproblems, is a popular and useful method for linearly constrained separable convex problems, especially in its linearized form. A well-known requirement for guaranteeing the convergence of the method in the literature is that the proximal regularization must be positive semidefinite. Recently it was shown by He et al. (Optimization O...
Improving an ADMM-like Splitting Method via Positive-Indefinite Proximal Regularization for Three-Block Separable Convex Minimization
The augmented Lagrangian method (ALM) is fundamental for solving convex minimization models with linear constraints. When the objective function is separable such that it can be represented as the sum of more than one function without coupled variables, various splitting versions of the ALM have been well studied in the literature such as the alternating direction method of multiplier...